Directional PointNet: 3D Environmental Classification for Wearable Robots
Author: Kuangen ZHANG, Jing WANG, Chenglong FU
Affiliation:

Department of Mechanical and Energy Engineering, Southern University of Science and Technology, Shenzhen 518055;
Department of Mechanical Engineering, The University of British Columbia, Vancouver V6T1Z4


Abstract:

    A subject wearing a suitable robotic device can walk in complex environments with the aid of environmental recognition schemes that provide reliable prior information about human motion intent. Researchers have used 1D laser signals and 2D depth images to classify environments, but those approaches can suffer from self-occlusion. In comparison, a 3D point cloud is more appropriate for depicting the environment. This paper proposes a directional PointNet that classifies the 3D point cloud directly. First, an inertial measurement unit (IMU) is used to correct the orientation of the point cloud. The directional PointNet then accurately classifies terrains commonly encountered in daily life, including level ground, stairs in the upward direction, and stairs in the downward direction. A classification accuracy of 98% has been achieved in tests. Moreover, the directional PointNet is more efficient than the previously used PointNet because the T-net, which is utilized to estimate the transformation of the point cloud, is not used in the present approach, and the length of the global feature is optimized. The experimental results demonstrate that the directional PointNet can classify environments in a robust and efficient manner.
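The IMU-based orientation correction described above can be sketched as follows. This is a minimal illustration, assuming the IMU reports Euler angles and using a ZYX rotation convention; the paper's exact convention and function names are not given in the abstract, so all identifiers here are illustrative.

```python
import numpy as np

def rotation_from_imu(roll, pitch, yaw):
    """Build a 3x3 rotation matrix from IMU Euler angles (radians).

    Illustrative ZYX (yaw-pitch-roll) convention; the paper's exact
    convention is not stated in the abstract.
    """
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def orient_point_cloud(points, roll, pitch, yaw):
    """Rotate an (N, 3) point cloud by the IMU orientation so that all
    clouds are expressed in a consistent, gravity-aligned frame before
    classification."""
    R = rotation_from_imu(roll, pitch, yaw)
    # points @ R.T applies R to each row vector (each 3D point).
    return points @ R.T
```

Aligning every cloud to a common frame this way is what lets the network omit the T-net: the orientation normalization is done once with the IMU measurement rather than learned from the data.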

Get Citation

Kuangen ZHANG, Jing WANG, Chenglong FU. Directional PointNet: 3D Environmental Classification for Wearable Robots[J]. Instrumentation, 2019, 6(1): 25-33

History
  • Online: October 29, 2020
License
  • Copyright (c) 2023 by the authors. This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.